Discrete Hessian Eigenmaps method for dimensionality reduction
Abstract
For a given set of data points lying on a low-dimensional manifold embedded in a high-dimensional space, the goal of dimensionality reduction is to recover a low-dimensional parametrization of the data set. The recently developed Hessian Eigenmaps method is mathematically rigorous and also provides a theoretical framework for the nonlinear dimensionality reduction problem. In this paper, we develop a ...
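As a loose illustration of this setup (not of the discrete method developed in the paper), the Python sketch below, assuming NumPy, generates points lying on a two-dimensional manifold (a Swiss roll) embedded in R^3 together with the hidden parameters that a method such as Hessian Eigenmaps is meant to recover.

```python
# Toy construction of the problem setup: points lying on a 2-D manifold
# embedded in R^3, together with the underlying parametrization that a
# dimensionality-reduction method should recover (up to a rigid motion).
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hidden 2-D parameters (the low-dimensional parametrization).
t = 1.5 * np.pi * (1 + 2 * rng.random(n))   # roll angle
h = 21.0 * rng.random(n)                    # height along the roll

# Observed high-dimensional data: the Swiss-roll embedding of (t, h) into R^3.
X = np.column_stack([t * np.cos(t), h, t * np.sin(t)])

print(X.shape)   # (2000, 3) -- what a dimensionality-reduction method sees
print(t.min(), t.max(), h.min(), h.max())   # ranges of the hidden parameters
```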
Similar resources

Continuous nonlinear dimensionality reduction by kernel Eigenmaps
We equate nonlinear dimensionality reduction (NLDR) to graph embedding with side information about the vertices, and derive a solution to either problem in the form of a kernel-based mixture of affine maps from the ambient space to the target space. Unlike most spectral NLDR methods, the central eigenproblem can be made relatively small, and the result is a continuous mapping defined over the e...
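The sketch below is only a rough analogue of the idea of a continuous map from the ambient space to the target space: it computes a discrete embedding on training points and then fits a kernel ridge regression so that unseen points can be embedded without re-solving the eigenproblem. The use of scikit-learn's SpectralEmbedding and KernelRidge is an assumption for illustration; it is not the kernel-based mixture of affine maps described in the abstract.

```python
# Rough analogue of a continuous NLDR map: compute a discrete embedding on
# training points, then fit a kernel regressor from the ambient space to the
# target space so new points can be embedded without re-solving anything.
# (Stand-in only; not the kernel mixture-of-affine-maps of the paper.)
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding
from sklearn.kernel_ridge import KernelRidge

X, _ = make_swiss_roll(n_samples=1000, random_state=0)
X_train, X_new = X[:800], X[800:]

# Discrete spectral embedding of the training points only.
Y_train = SpectralEmbedding(n_components=2, n_neighbors=10).fit_transform(X_train)

# Continuous map R^3 -> R^2 defined over the whole ambient space.
mapper = KernelRidge(kernel="rbf", gamma=0.05, alpha=1e-3).fit(X_train, Y_train)
Y_new = mapper.predict(X_new)   # embeddings for unseen points
print(Y_new.shape)              # (200, 2)
```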
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
One of the central problems in machine learning and pattern recognition is to develop appropriate representations for complex data. We consider the problem of constructing a representation for data lying on a low-dimensional manifold embedded in a high-dimensional space. Drawing on the correspondence between the graph Laplacian, the Laplace Beltrami operator on the manifold, and the connections ...
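A minimal dense sketch of the Laplacian Eigenmaps construction described above, assuming NumPy/SciPy: build a k-nearest-neighbour graph with heat-kernel weights, form the graph Laplacian L = D - W, and take the bottom non-trivial generalized eigenvectors of L y = λ D y as the embedding coordinates. Library implementations (e.g. sklearn.manifold.SpectralEmbedding) use sparse solvers and are preferable for real data.

```python
# Minimal Laplacian Eigenmaps sketch (dense, for illustration only):
# k-NN adjacency with heat-kernel weights, then the bottom generalized
# eigenvectors of L y = lambda D y give the low-dimensional coordinates.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=10, sigma=1.0):
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")

    # Symmetric k-nearest-neighbour adjacency with heat-kernel weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:k + 1]      # skip the point itself
        W[i, idx] = np.exp(-D2[i, idx] / (2 * sigma**2))
    W = np.maximum(W, W.T)

    D = np.diag(W.sum(axis=1))
    L = D - W                                  # unnormalized graph Laplacian

    # Smallest eigenpairs of L y = lambda D y; drop the constant eigenvector.
    vals, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]

Y = laplacian_eigenmaps(np.random.rand(300, 5), n_components=2, k=8)
print(Y.shape)  # (300, 2)
```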
Hessian eigenmaps: locally linear embedding techniques for high-dimensional data.
We describe a method for recovering the underlying parametrization of scattered data m_i lying on a manifold M embedded in high-dimensional Euclidean space. The method, Hessian-based locally linear embedding, derives from a conceptual framework of local isometry in which the manifold M, viewed as a Riemannian submanifold of the ambient Euclidean space R^n, is locally isometric to an open, c...
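scikit-learn's LocallyLinearEmbedding with method='hessian' provides an implementation of this Hessian-based locally linear embedding; the sketch below applies it to Swiss-roll data as a usage example (the parameter values are assumptions; note the requirement n_neighbors > n_components * (n_components + 3) / 2).

```python
# Sketch: Hessian-based locally linear embedding via scikit-learn.
# Requires n_neighbors > n_components * (n_components + 3) / 2 (here > 5).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Scattered data m_i on a 2-D manifold M embedded in R^3.
X, _ = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)

hlle = LocallyLinearEmbedding(
    n_neighbors=12,
    n_components=2,
    method="hessian",
    random_state=0,
)
Y = hlle.fit_transform(X)   # recovered 2-D parametrization
print(Y.shape)              # (1500, 2)
```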
Thresholding Method for Reduction of Dimensionality
Often recognition systems must be designed with a relatively small amount of training data. Plug-in test statistics suffer from large estimation errors, often causing the performance to degrade with increasing size of the measurement vector. Choosing a better test statistic or applying a method of dimensionality reduction are two possible solutions to the problem above. In this paper we consider...
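As a generic illustration of thresholding as dimensionality reduction (not the specific test statistic studied in the paper), the sketch below keeps only the coordinates whose estimated two-class separation exceeds a threshold before any classifier is built; the data generation and the threshold value are assumptions.

```python
# Generic illustration of thresholding as dimensionality reduction: keep only
# the coordinates whose estimated class separation exceeds a threshold before
# building a plug-in classifier.  (Stand-in, not the paper's statistic.)
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 60                      # many measurements, little training data

# Two Gaussian classes that differ only in the first 10 coordinates.
mu = np.zeros(d)
mu[:10] = 1.0
X0 = rng.normal(0.0, 1.0, size=(n, d))
X1 = rng.normal(mu, 1.0, size=(n, d))

# Per-coordinate two-sample statistic (difference of means over pooled std).
stat = np.abs(X1.mean(0) - X0.mean(0)) / np.sqrt(0.5 * (X1.var(0) + X0.var(0)))

threshold = 0.5
keep = stat > threshold             # coordinates retained for classification
print(keep.sum(), "of", d, "coordinates kept")
```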
Journal
Journal title: Journal of Computational and Applied Mathematics
Year: 2015
ISSN: 0377-0427
DOI: 10.1016/j.cam.2014.09.011